In probability theory and statistics, a '''central moment''' is a moment of a probability distribution of a random variable about the random variable's mean: the ''n''th central moment is the expected value of the ''n''th power of the deviation of the random variable from its mean. (The ''r''th moment about an arbitrary point ''a'' is simply called the ''r''th moment about ''a''; only moments about the mean are central moments.) The various moments form one set of values by which the properties of a probability distribution can be usefully characterised. Central moments are used in preference to ordinary moments, which are computed in terms of deviations from zero rather than from the mean, because the higher-order central moments relate only to the spread and shape of the distribution, rather than also to its location. Sets of central moments can be defined for both univariate and multivariate distributions.

==Univariate moments==

The ''n''th moment about the mean (or ''n''th central moment) of a real-valued random variable ''X'' is the quantity

: <math>\mu_n := \operatorname{E}\!\left[(X - \operatorname{E}[X])^n\right],</math>

where E is the expectation operator. For a continuous univariate probability distribution with probability density function ''f''(''x''), the ''n''th moment about the mean μ is

: <math>\mu_n = \operatorname{E}\!\left[(X - \mu)^n\right] = \int_{-\infty}^{\infty} (x - \mu)^n f(x)\,dx.</math>

For random variables that have no mean, such as the Cauchy distribution, central moments are not defined.

The first few central moments have intuitive interpretations:
* The "zeroth" central moment μ<sub>0</sub> is 1.
* The first central moment μ<sub>1</sub> is 0 (not to be confused with the first raw moment, the expected value or mean).
* The second central moment μ<sub>2</sub> is called the variance, and is usually denoted σ<sup>2</sup>, where σ represents the standard deviation.
* The third and fourth central moments are used to define the standardized moments, which in turn define the skewness and kurtosis, respectively.
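The definition above can be checked numerically: replacing the expectation with a sample average gives the empirical ''n''th central moment. The sketch below (a minimal illustration, with a hypothetical helper name `central_moment` and arbitrarily chosen normal parameters) confirms that the zeroth central moment is exactly 1, the first is approximately 0, and the second approximates the variance.

```python
import numpy as np

def central_moment(samples, n):
    """Empirical n-th central moment: the mean of (x - sample mean)^n."""
    mean = np.mean(samples)
    return np.mean((samples - mean) ** n)

# Draw from a normal distribution with mean 5 and standard deviation 2,
# so the true variance (second central moment) is 4.
rng = np.random.default_rng(0)
samples = rng.normal(loc=5.0, scale=2.0, size=1_000_000)

print(central_moment(samples, 0))  # zeroth central moment: exactly 1
print(central_moment(samples, 1))  # first central moment: ~0
print(central_moment(samples, 2))  # second central moment: ~4 (the variance)
```

For higher orders, dividing μ<sub>3</sub> and μ<sub>4</sub> by σ<sup>3</sup> and σ<sup>4</sup> respectively yields the standardized skewness and kurtosis mentioned above.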